Discrete Gradient Method: Derivative-Free Method for Nonsmooth Optimization
Authors
Abstract
Similar Papers
A Derivative-free Method for Linearly Constrained Nonsmooth Optimization
This paper develops a new derivative-free method for solving linearly constrained nonsmooth optimization problems. The objective functions in these problems are, in general, non-regular locally Lipschitz continuous functions. The computation of generalized subgradients of such functions is a difficult task. In this paper, we suggest an algorithm for the computation of subgradients of a broad class ...
Derivative free optimization method
Derivative free optimization (DFO) methods are typically designed to solve optimization problems whose objective function is computed by a “black box”; hence, the gradient computation is unavailable. Each call to the “black box” is often expensive, so estimating derivatives by finite differences may be prohibitively costly. Finally, the objective function value may be computed with some noise, ...
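The finite-difference gradient estimation that the abstract warns about can be sketched concretely. The function `fd_gradient` below is an illustrative helper (not from any of the cited papers): a forward-difference scheme that needs n+1 calls to the black box per gradient, which is exactly why it becomes prohibitively costly when each call is expensive.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient estimate: n + 1 black-box calls."""
    n = len(x)
    fx = f(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h          # perturb one coordinate at a time
        g[i] = (f(x + e) - fx) / h
    return g

# Smooth example: f(x) = x0^2 + 3*x1, true gradient at (1, 2) is (2, 3)
f = lambda x: x[0]**2 + 3 * x[1]
g = fd_gradient(f, np.array([1.0, 2.0]))
# g ≈ [2.0, 3.0] up to O(h) truncation error
```

If `f` is evaluated with noise of magnitude ε, the estimate carries an error of order ε/h, so the step `h` cannot simply be made small; this is the trade-off that motivates dedicated DFO methods.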
A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization
We analyze stochastic gradient algorithms for optimizing nonconvex, nonsmooth finite-sum problems. In particular, the objective function is given by the summation of a differentiable (possibly nonconvex) component, together with a possibly non-differentiable but convex component. We propose a proximal stochastic gradient algorithm based on variance reduction, called ProxSVRG+. The algorithm is ...
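The basic proximal stochastic gradient iteration underlying methods of this family can be sketched as follows. This is a plain proximal SGD on a toy problem, not the ProxSVRG+ algorithm itself (which adds variance reduction); the names `prox_sgd` and `soft_threshold` and the toy data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd(grad_fi, prox, x0, n, eta=0.05, iters=5000, seed=0):
    """Plain proximal SGD: x <- prox(x - eta * grad f_i(x), eta),
    with the index i of the smooth component drawn uniformly."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        i = rng.integers(n)
        x = prox(x - eta * grad_fi(x, i), eta)
    return x

# Toy problem: minimize (1/n) * sum_i (x - a_i)^2 + lam * |x|
a = np.array([0.9, 1.1] * 50)               # n = 100 samples, mean 1.0
lam = 0.5
grad_fi = lambda x, i: 2.0 * (x - a[i])     # gradient of the i-th smooth term
prox = lambda v, eta: soft_threshold(v, eta * lam)
x = prox_sgd(grad_fi, prox, np.array([0.0]), len(a))
# the closed-form minimizer is soft_threshold(mean(a), lam/2) = 0.75
```

The smooth (here even convex) part is handled by a stochastic gradient step, while the non-differentiable convex part enters only through its proximal operator — the same split the abstract describes.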
A Linesearch-Based Derivative-Free Approach for Nonsmooth Constrained Optimization
In this paper, we propose new linesearch-based methods for nonsmooth constrained optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequal...
The discrete gradient evolutionary strategy method for global optimization
Global optimization problems continue to be a challenge in computational mathematics. The field is progressing in two streams: deterministic and heuristic approaches. In this paper, we present a hybrid method that uses the discrete gradient method, which is a derivative-free local search method, and evolutionary strategies. We show that the hybridization of the two methods is better than each o...
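The hybrid idea — a global evolutionary search handing its best point to a derivative-free local search — can be sketched in a few lines. This is a toy illustration under stated simplifications: the local phase is a simple coordinate search rather than the discrete gradient method of the paper, and `hybrid_es` is a bare (1, λ)-style evolution strategy; all names are hypothetical.

```python
import numpy as np

def coordinate_search(f, x, step=0.1, shrink=0.5, tol=1e-6):
    """Toy derivative-free local search: probe +/- step along each axis,
    halve the step when no axis improves."""
    x = x.copy()
    while step > tol:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x.copy()
                y[i] += s
                if f(y) < f(x):
                    x, improved = y, True
        if not improved:
            step *= shrink
    return x

def hybrid_es(f, dim, pop=20, gens=30, sigma=1.0, seed=0):
    """Evolutionary strategy for global exploration (Gaussian mutations,
    keep the best), then derivative-free local refinement."""
    rng = np.random.default_rng(seed)
    best = rng.normal(size=dim)
    for _ in range(gens):
        cand = best + sigma * rng.normal(size=(pop, dim))
        vals = np.array([f(c) for c in cand])
        i = vals.argmin()
        if vals[i] < f(best):
            best = cand[i]
        sigma *= 0.9          # gradually narrow the search
    return coordinate_search(f, best)

# Nonsmooth test function with minimizer (1, -2)
x = hybrid_es(lambda v: abs(v[0] - 1.0) + abs(v[1] + 2.0), 2)
```

The ES phase supplies global coverage while the local phase, which needs no derivatives, sharpens the best candidate — the division of labor the abstract argues for.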
Journal
Journal title: Journal of Optimization Theory and Applications
Year: 2007
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-007-9335-5